Hi guys,

I have been playing around with the new Motion Capture features in ARKit 3, and there are a couple of things I don't see in the documentation that I'm hoping people can help fill in for me.

It looks like jointModelTransforms returns a 4x4 matrix for each joint. What is in this matrix? In the "Capturing Body Motion in 3D" sample code it looks like column 3 holds the 3D coordinates, but what are the other columns? (https://developer.apple.com/documentation/arkit/arskeleton3d/3295992-jointmodeltransforms)

Is there a confidence metric for joint identification available somewhere within ARSkeleton3D?
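In case it helps frame the question, here is a minimal Swift sketch of how I'm currently reading the transforms. It assumes they are ordinary column-major simd_float4x4 matrices with translation in column 3 and rotation in the upper-left 3x3, expressed relative to the skeleton root; the function name and that interpretation are my own assumptions, not something from the docs:

```swift
import ARKit
import simd

// A sketch of how I'm interpreting jointModelTransforms, assuming standard
// column-major rigid-body transforms relative to the skeleton's root joint.
func inspectJoints(of bodyAnchor: ARBodyAnchor) {
    let skeleton = bodyAnchor.skeleton

    for (index, transform) in skeleton.jointModelTransforms.enumerated() {
        let jointName = skeleton.definition.jointNames[index]

        // Column 3 appears to carry the joint's position in model (root) space.
        let position = simd_make_float3(transform.columns.3)

        // Columns 0-2 would then carry the joint's orientation; converting the
        // matrix to a quaternion extracts just the rotation part.
        let orientation = simd_quatf(transform)

        print("\(jointName): position \(position), orientation \(orientation)")
    }
}
```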
Hi All,

I have been using TestFlight to collect feedback, mostly screenshots, and I can see it all in the App Store Connect web interface. Is there a way to download all of this feedback at once, including the screenshots and JSON files? Even better if there is a way to automate the creation of bug tickets in JIRA.
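For the JIRA half, this is roughly what I had in mind: a hedged sketch that files one issue through the JIRA Cloud REST API. The site URL, project key, credentials, and the idea that each feedback item has already been turned into a summary/description pair are all placeholders and assumptions on my part:

```swift
import Foundation

// Hypothetical values; replace with your own JIRA Cloud site, project, and API token.
let jiraBaseURL = URL(string: "https://your-domain.atlassian.net")!
let credentials = Data("you@example.com:YOUR_API_TOKEN".utf8).base64EncodedString()

/// Creates a single JIRA issue from one piece of TestFlight feedback.
func fileJiraTicket(summary: String, description: String) async throws {
    var request = URLRequest(url: jiraBaseURL.appendingPathComponent("rest/api/2/issue"))
    request.httpMethod = "POST"
    request.setValue("Basic \(credentials)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    // Minimal issue payload: project key, summary, description, and issue type.
    let payload: [String: Any] = [
        "fields": [
            "project": ["key": "APP"],
            "summary": summary,
            "description": description,
            "issuetype": ["name": "Bug"]
        ]
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: payload)

    let (data, response) = try await URLSession.shared.data(for: request)
    guard let http = response as? HTTPURLResponse, http.statusCode == 201 else {
        throw URLError(.badServerResponse)
    }
    print("Created issue:", String(decoding: data, as: UTF8.self))
}
```

The missing piece is still pulling the feedback (screenshots and JSON) out of App Store Connect in bulk in the first place.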
Hi All,

Is there any way to record an AR session with Reality Composer 1.4 that includes LiDAR data? I have used this recording method to capture test sessions that I can play back in Xcode for development and testing while building my app. When I go to the developer section in Reality Composer, the Record AR Session option is greyed out on the new iPad. Are there any workarounds, or any info about when this feature might become possible in Reality Composer?
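For context, this is the LiDAR-dependent configuration my app enables, which is what a recorded session would need to cover on playback. It's just a sketch assuming a standard world-tracking setup:

```swift
import ARKit

// A sketch of the LiDAR-dependent options the recording would need to cover;
// assumes an ordinary ARWorldTrackingConfiguration session.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()

    // Scene reconstruction (the mesh built from LiDAR) is only available on
    // LiDAR-equipped devices, so check before opting in.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Scene depth is the per-frame LiDAR depth map, also gated on support.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    return configuration
}
```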
Is it possible to use a HomeKit camera accessory to provide a video stream for a general-purpose object recognition app using the Vision framework? I see the HomeKit cameraView documentation (https://developer.apple.com/documentation/homekit/cameraview), but that appears to only allow displaying a stream within your app; it doesn't really let you do much else with it.
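The Vision side seems straightforward once you have pixel buffers; the part I can't figure out is getting frames out of HomeKit at all. A minimal sketch of what I'd like to run, assuming I could obtain a CVPixelBuffer per frame from some source:

```swift
import Vision
import CoreVideo

// A minimal sketch of the Vision side, assuming frames arrive as CVPixelBuffers
// from some source; extracting them from a HomeKit camera stream is the open question.
func classifyObjects(in pixelBuffer: CVPixelBuffer) {
    // Built-in image classification; a VNCoreMLRequest with a custom model
    // would slot in the same way.
    let request = VNClassifyImageRequest { request, error in
        guard let observations = request.results as? [VNClassificationObservation] else { return }
        for observation in observations.prefix(5) {
            print("\(observation.identifier): \(observation.confidence)")
        }
    }

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    do {
        try handler.perform([request])
    } catch {
        print("Vision request failed: \(error)")
    }
}
```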